Asymptotic Normality of the L_k-Error of the Grenander Estimator
Author
Abstract
We investigate the limit behavior of the L_k-distance between a decreasing density f and its nonparametric maximum likelihood estimator f̂_n for k ≥ 1. Due to the inconsistency of f̂_n at zero, the case k = 2.5 turns out to be a kind of transition point. We extend asymptotic normality of the L_1-distance to the L_k-distance for 1 ≤ k < 2.5, and obtain the analogous limiting result for a modification of the L_k-distance for k ≥ 2.5. Since the L_1-distance is the area between f and f̂_n, which is also the area between the inverse g of f and the more tractable inverse U_n of f̂_n, the problem can be reduced immediately to deriving asymptotic normality of the L_1-distance between U_n and g. Although we lose this easy correspondence for k > 1, we show that the L_k-distance between f and f̂_n is asymptotically equivalent to the L_k-distance between U_n and g.

1. Introduction. Let f be a nonincreasing density with compact support. Without loss of generality, assume this to be the interval [0, 1]. The nonparametric maximum likelihood estimator f̂_n of f was discovered by Grenander [2]. It is defined as the left derivative of the least concave majorant (LCM) of the empirical distribution function F_n constructed from a sample X_1, ..., X_n from f. Prakasa Rao [11] obtained the earliest result on the asymptotic pointwise behavior of the Grenander estimator. One immediately striking feature of this result is that the rate of convergence is of the same order as the rate of convergence of histogram estimators, and that the asymptotic distribution is not normal. It took much longer to develop distributional theory for global measures of performance for this estimator. The first distributional result for a global measure of deviation was the convergence to a normal distribution of the L_1-error mentioned in [3] (see [4] for a
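The construction described above, taking the left derivative of the least concave majorant of the empirical distribution function, can be computed exactly with a convex-hull pass over the ECDF points. The following is an illustrative sketch of this computation (the function name `grenander` and the use of NumPy are my own choices, not from the paper); the LCM is obtained as the upper convex hull of the points (0, 0), (X_(1), 1/n), ..., (X_(n), 1):

```python
import numpy as np

def grenander(sample):
    """Grenander estimator: left derivative of the least concave
    majorant (LCM) of the empirical distribution function.
    Returns the knot locations and the stepwise density values."""
    x = np.sort(np.asarray(sample, dtype=float))
    n = len(x)
    # Points of the empirical CDF, including the origin (0, 0).
    xs = np.concatenate(([0.0], x))
    ys = np.arange(n + 1) / n
    # Build the upper convex hull (= LCM) with a monotone-chain stack.
    hull = [0]
    for i in range(1, n + 1):
        while len(hull) >= 2:
            j, k = hull[-2], hull[-1]
            # Pop k if it lies on or below the chord from j to i,
            # i.e. the majorant passes above it.
            if (ys[k] - ys[j]) * (xs[i] - xs[j]) <= (ys[i] - ys[j]) * (xs[k] - xs[j]):
                hull.pop()
            else:
                break
        hull.append(i)
    knots = xs[hull]
    # Left-hand slopes of the LCM = estimated density on each interval.
    dens = np.diff(ys[hull]) / np.diff(knots)
    return knots, dens
```

Because the hull is concave, the returned density values are automatically nonincreasing, and they integrate to exactly 1 since the slopes telescope from F_n(0) = 0 to F_n(X_(n)) = 1.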
Similar resources
Asymptotic Behaviors of Nearest Neighbor Kernel Density Estimator in Left-truncated Data
Kernel density estimators are the basic tools for density estimation in non-parametric statistics. The k-nearest neighbor kernel estimators represent a special form of kernel density estimators, in which the bandwidth is varied depending on the location of the sample points. In this paper, we initially introduce the k-nearest neighbor kernel density estimator in the random left-truncatio...
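The idea summarized above, a kernel estimator whose bandwidth at each point is the distance to the k-th nearest sample point, can be sketched as follows. This is only an illustration of the general k-NN kernel idea under my own choices (the name `knn_kernel_density` and the Epanechnikov kernel are assumptions, not from the cited paper):

```python
import numpy as np

def knn_kernel_density(x, sample, k=10):
    """k-nearest-neighbor kernel density estimate at points x:
    the bandwidth at each evaluation point is the distance to the
    k-th nearest sample point, so it adapts to local data density."""
    sample = np.asarray(sample, dtype=float)
    x = np.atleast_1d(np.asarray(x, dtype=float))
    n = len(sample)
    out = np.empty_like(x)
    for i, xi in enumerate(x):
        d = np.abs(sample - xi)
        # Local bandwidth: distance to the k-th nearest observation.
        h = np.partition(d, k - 1)[k - 1]
        u = (xi - sample) / h
        # Epanechnikov kernel on [-1, 1].
        kern = np.where(np.abs(u) <= 1, 0.75 * (1 - u ** 2), 0.0)
        out[i] = kern.sum() / (n * h)
    return out
```

Larger k widens the local bandwidth and smooths the estimate; in sparse regions the bandwidth grows automatically, which is the main appeal over a fixed-bandwidth kernel estimator.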
Some Asymptotic Results of Kernel Density Estimator in Length-Biased Sampling
In this paper, we prove the strong uniform consistency and asymptotic normality of the kernel density estimator proposed by Jones [12] for length-biased data. The approach is based on the invariance principle for the empirical processes proved by Horváth [10]. All simulations are drawn for different cases to demonstrate both consistency and asymptotic normality, and the method is illustrated by ...
Ridge Stochastic Restricted Estimators in Semiparametric Linear Measurement Error Models
In this article we consider the stochastic restricted ridge estimation in semiparametric linear models when the covariates are measured with additive errors. The development of the penalized corrected likelihood method in such a model is the basis for the derivation of ridge estimates. The asymptotic normality of the resulting estimates is established. Also, necessary and sufficient condition...
A Berry-Esseen Type Bound for a Smoothed Version of Grenander Estimator
In various statistical models, such as density estimation and estimation of regression curves or hazard rates, monotonicity constraints can arise naturally. A frequently encountered problem in nonparametric statistics is to estimate a monotone density function f on a compact interval. A well-known estimator for the density f under the restriction that f is decreasing is the Grenander estimator, ...
Asymptotic normality of Powell's kernel estimator
In this paper, we establish asymptotic normality of Powell’s kernel estimator for the asymptotic covariance matrix of the quantile regression estimator for both i.i.d. and weakly dependent data. As an application, we derive the optimal bandwidth that minimizes the approximate mean squared error of the kernel estimator.
Publication date: 2005